We report the frequency response of Al0.3InAsSb/Al0.7InAsSb nBn photodetectors. The 3-dB bandwidth of the devices varies from ∼150 MHz to ∼700 MHz with device diameter and saturates with bias voltage immediately after the device turns on. A new equivalent circuit model is developed to explain the frequency behavior of nBn photodetectors. The bandwidth simulated with this model agrees well with both the bandwidth and the microwave scattering-parameter measurements. The analysis reveals that the bandwidth of the nBn photodetector is limited by the large diffusion capacitance, which is set by the minority-carrier lifetime and the device area. Additionally, the bandwidth of the nBn photodetector is barely affected by the photocurrent, a behavior attributed to the barrier structure of the nBn device.
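The area scaling described above follows from a first-order RC picture: if the diffusion capacitance grows with mesa area, the 3-dB bandwidth into a fixed load falls as the device diameter grows. The sketch below illustrates this scaling only; the 50 Ω load and the capacitance density `C_PER_AREA` are illustrative assumptions, not values from the paper, and the full equivalent circuit model contains more elements than this single RC pole.

```python
import math

def f3db(r_ohm, c_farad):
    """First-order RC-limited 3-dB bandwidth: f = 1 / (2*pi*R*C)."""
    return 1.0 / (2.0 * math.pi * r_ohm * c_farad)

def device_capacitance(diameter_um, c_per_area_f_cm2):
    """Capacitance proportional to mesa area (illustrative scaling only)."""
    radius_cm = diameter_um * 1e-4 / 2.0
    area_cm2 = math.pi * radius_cm ** 2
    return c_per_area_f_cm2 * area_cm2

R_LOAD = 50.0        # ohms; assumed measurement load, not from the paper
C_PER_AREA = 2.7e-7  # F/cm^2; hypothetical diffusion-capacitance density

for d_um in (50, 100, 200):  # example device diameters in micrometers
    c = device_capacitance(d_um, C_PER_AREA)
    print(f"{d_um:>3} um: C = {c * 1e12:5.1f} pF, "
          f"f3dB = {f3db(R_LOAD, c) / 1e6:5.0f} MHz")
```

Because the diffusion capacitance scales with area (diameter squared), halving the diameter raises the RC-limited bandwidth roughly fourfold in this simple picture.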
-
Frequency behavior of AlInAsSb nBn photodetectors and the development of an equivalent circuit model
-
3D sensing is a fundamental function that enables imaging with depth information, generally achieved via the time-of-flight (ToF) principle. However, the time-to-digital converters (TDCs) in conventional ToF sensors are usually bulky and complex and exhibit large delay and power loss. To overcome these issues, a resistive time-of-flight (R-ToF) sensor that measures depth information in the analog domain by mimicking the biological process of spike-timing-dependent plasticity (STDP) is proposed herein. The R-ToF sensors, based on avalanche photodiodes (APDs) integrated with memristive intelligent matter, achieve a scan depth of up to 55 cm (≈89% accuracy and 2.93 cm standard deviation) and low power consumption (0.5 nJ/step) without TDCs. In-depth computing is realized via R-ToF 3D imaging and memristive classification. This R-ToF system opens a new pathway for miniaturized, energy-efficient neuromorphic vision engineering that can be harnessed in light detection and ranging (LiDAR), autonomous vehicles, biomedical in vivo imaging, and augmented/virtual reality.
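The underlying relation every ToF sensor relies on is that depth equals half the round-trip distance of a light pulse, d = c·t/2. The sketch below shows only this basic conversion; the R-ToF sensor described above evaluates it in the analog domain through memristive conductance changes, which is not modeled here.

```python
# Speed of light in vacuum, m/s
C_LIGHT = 299_792_458.0

def tof_depth_m(round_trip_s):
    """Depth from a measured round-trip time of flight: d = c * t / 2."""
    return C_LIGHT * round_trip_s / 2.0

# Example: a target at the reported 55 cm maximum scan depth.
t = 2.0 * 0.55 / C_LIGHT  # round-trip time for a 0.55 m target
print(f"round trip: {t * 1e9:.2f} ns -> depth: {tof_depth_m(t) * 100:.1f} cm")
```

At the 55 cm scan depth quoted in the abstract, the round-trip time is only a few nanoseconds, which is why conventional designs need fast, power-hungry TDCs and why an analog timing path is attractive.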